Expectation Propagation for the Generative Aspect Model

Authors

  • Tom Minka
  • John D. Lafferty
Abstract

The generative aspect model is an extension of the multinomial model for text that allows word probabilities to vary stochastically across documents. Previous results with aspect models have been promising, but hindered by the computational difficulty of carrying out inference and learning. This paper demonstrates that the simple variational methods of Blei et al. (2001) can lead to inaccurate inferences and biased learning for the generative aspect model. We develop an alternative approach that leads to higher accuracy at comparable cost. An extension of Expectation Propagation is used for inference and then embedded in an EM algorithm for learning. Experimental results are presented for both synthetic and real data sets.
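As a concrete illustration of the generative process described in the abstract, below is a minimal Python sketch of an aspect model in which each document draws its own aspect mixing weights, so that word probabilities vary stochastically across documents. The Dirichlet prior on the mixing weights, the variable names, and the toy sizes are illustrative assumptions, not details taken from the paper.

    # Minimal generative-process sketch for an aspect model (illustrative only).
    # Assumes a Dirichlet prior over per-document mixing weights; names such as
    # alpha, word_probs, and n_words are hypothetical, not from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    n_aspects, vocab_size = 3, 10
    alpha = np.full(n_aspects, 0.5)                      # Dirichlet hyperparameters
    word_probs = rng.dirichlet(np.ones(vocab_size),      # p(word | aspect), one row per aspect
                               size=n_aspects)

    def generate_document(n_words=20):
        """Sample one document: aspect weights are drawn per document."""
        lam = rng.dirichlet(alpha)                       # document-specific mixing weights
        words = []
        for _ in range(n_words):
            aspect = rng.choice(n_aspects, p=lam)        # choose an aspect for this word
            words.append(rng.choice(vocab_size, p=word_probs[aspect]))
        return words

    print(generate_document())

Inference in this model is hard because the per-document weights must be integrated out; the paper's contribution is an Expectation Propagation approximation for that step, embedded in EM for learning the model parameters.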


Similar Resources

Modelling Smooth Paths Using Gaussian Processes

A generative model based on the Gaussian mixture model and Gaussian processes is presented in this paper. Typical motion paths are learnt and then used for motion prediction with this model. The principal novel aspect of this approach is the modelling of paths using Gaussian processes, which allows the representation of smooth trajectories and avoids the discretization problems found in most existing...


Leveraging Multi-aspect Time-related Influence in Location Recommendation

Point-Of-Interest (POI) recommendation aims to mine a user's visiting history and find her/his potentially preferred places. Although location recommendation methods have been studied and improved pervasively, challenges remain in employing various influences, including the temporal aspect. Inspired by the fact that time includes numerous granular slots (e.g. minute, hour, day, week and...


Expectation Propagation for Approximate Inference in Dynamic Bayesian Networks

We describe expectation propagation for approximate inference in dynamic Bayesian networks as a natural extension of Pearl's exact belief propagation. Expectation propagation is a greedy algorithm that converges in many practical cases, but not always. We derive a double-loop algorithm, guaranteed to converge to a local minimum of a Bethe free energy. Furthermore, we show that stable fixe...


Expectation Truncation and the Benefits of Preselection In Training Generative Models

We show how a preselection of hidden variables can be used to efficiently train generative models with binary hidden variables. The approach is based on Expectation Maximization (EM) and uses an efficiently computable approximation to the sufficient statistics of a given model. The computational cost to compute the sufficient statistics is strongly reduced by selecting, for each data point, the...


A Generative Parser with a Discriminative Recognition Algorithm

Generative models defining joint distributions over parse trees and sentences are useful for parsing and language modeling, but impose restrictions on the scope of features and are often outperformed by discriminative models. We propose a framework for parsing and language modeling which marries a generative model with a discriminative recognition model in an encoder-decoder setting. We provide...




Publication date: 2002